Learning Metrics via Discriminant Kernels and Multidimensional Scaling: Toward Expected Euclidean Representation

Author

  • Zhihua Zhang
Abstract

Distance-based methods in machine learning and pattern recognition rely on a metric distance between points in the input space. Instead of specifying a metric a priori, we seek to learn the metric from data via kernel methods and multidimensional scaling (MDS) techniques. Under the classification setting, we define discriminant kernels on the joint space of the input and output spaces and present a specific family of discriminant kernels. This family of discriminant kernels is attractive because the induced metrics are Euclidean and Fisher separable, and MDS techniques can be used to find the low-dimensional Euclidean representations (also called feature vectors) of the induced metrics. Since the feature vectors incorporate information from both the input points and their corresponding labels, and they enjoy Fisher separability, they are well suited for use in distance-based classifiers.
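The pipeline described in the abstract — build a label-aware kernel on the joint input-output space, then recover low-dimensional Euclidean feature vectors by classical MDS — can be sketched as follows. The discriminant kernel below (an input RBF kernel blended with a label-agreement kernel, with weight `alpha`) is a hypothetical stand-in for illustration, not the specific family proposed in the paper:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian RBF kernel from pairwise squared Euclidean distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def discriminant_kernel(X, y, gamma=1.0, alpha=0.5):
    # Illustrative joint kernel: blend the input kernel with a
    # label-agreement kernel (1 when labels match, else 0).
    # Both terms are PSD, so the sum is a valid kernel.
    Kx = rbf_kernel(X, gamma)
    Ky = (y[:, None] == y[None, :]).astype(float)
    return alpha * Kx + (1 - alpha) * Kx * Ky

def mds_embedding(K, dim=2):
    # Classical MDS: double-center the kernel matrix and keep the
    # top eigenvectors, scaled by the square roots of the eigenvalues.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    B = H @ K @ H
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]
    w, V = np.clip(w[idx], 0, None), V[:, idx]
    return V * np.sqrt(w)

# Toy two-class data: the feature vectors mix input geometry and labels.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Z = mds_embedding(discriminant_kernel(X, y), dim=2)
print(Z.shape)  # (40, 2)
```

The resulting rows of `Z` are the Euclidean feature vectors that a distance-based classifier (e.g. nearest neighbor) would consume.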


Related Articles

On the Schoenberg Transformations in Data Analysis: Theory and Illustrations

The class of Schoenberg transformations, embedding Euclidean distances into higher dimensional Euclidean spaces, is presented, and derived from theorems on positive definite and conditionally negative definite matrices. Original results on the arc lengths, angles and curvature of the transformations are proposed, and visualized on artificial data sets by classical multidimensional scaling. A si...
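A minimal numerical illustration of the snippet's claim: a Schoenberg transformation such as d ↦ d^a with 0 < a ≤ 1 maps Euclidean distances to distances that are again Euclidean, which can be verified by checking that the double-centered squared-distance matrix is positive semidefinite. The data here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(10, 3))
D = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # Euclidean distances

# Schoenberg transformation: d -> d**a, 0 < a <= 1.
Dt = D ** 0.5

# The transformed distances are Euclidean iff -0.5 * H * Dt^2 * H is PSD,
# where H is the centering matrix.
n = D.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * H @ (Dt ** 2) @ H
eigs = np.linalg.eigvalsh(B)
print(eigs.min() >= -1e-9)  # True: the embedding remains Euclidean
```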


Dimensionality Reduction via Euclidean Distance Embeddings

This report provides a mathematically thorough review and investigation of metric multidimensional scaling (MDS) through the analysis of Euclidean distances in input and output spaces. By combining a geometric approach with modern linear algebra and multivariate analysis, metric MDS is viewed as a Euclidean distance embedding transformation that converts between coordinate and coordinate-free r...


A Note on Gradient Based Learning in Vector Quantization Using Differentiable Kernels for Hilbert and Banach Spaces

Supervised and unsupervised prototype-based vector quantization are frequently carried out in Euclidean space. In recent years, non-standard metrics have also become popular. For classification by support vector machines, Hilbert or Banach space representations based on so-called kernel metrics are very successful. In this paper we give the mathematical justification that gradient-based learning...


Semisupervised learning from dissimilarity data

The following two-stage approach to learning from dissimilarity data is described: (1) embed both labeled and unlabeled objects in a Euclidean space; then (2) train a classifier on the labeled objects. The use of linear discriminant analysis for (2), which naturally invites the use of classical multidimensional scaling for (1), is emphasized. The choice of the dimension of the Euclidean space i...
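The two-stage approach in this snippet can be sketched as follows: classical MDS handles stage (1), and a nearest-centroid rule stands in for the linear discriminant analysis of stage (2). All data, the labeled/unlabeled split, and variable names here are illustrative:

```python
import numpy as np

def cmds(D, dim=2):
    # Stage 1 -- classical MDS: embed objects from a dissimilarity matrix D.
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * H @ (D ** 2) @ H
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (15, 3)), rng.normal(2, 1, (15, 3))])
y = np.array([0] * 15 + [1] * 15)
D = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # dissimilarity data

Z = cmds(D, dim=2)              # embed labeled AND unlabeled objects together
labeled = np.arange(0, 30, 2)   # pretend only every other object is labeled

# Stage 2 -- train a classifier on the labeled objects only
# (nearest centroid, a simple stand-in for LDA).
mu0 = Z[labeled[y[labeled] == 0]].mean(axis=0)
mu1 = Z[labeled[y[labeled] == 1]].mean(axis=0)
pred = (np.linalg.norm(Z - mu1, axis=1)
        < np.linalg.norm(Z - mu0, axis=1)).astype(int)
print((pred == y).mean())
```

Embedding labeled and unlabeled objects jointly is what makes the scheme semisupervised: the unlabeled points shape the Euclidean configuration that the classifier is trained on.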


A Fast $\mathcal{L}_p$ Spike Alignment Metric

The metrization of the space of neural responses is an ongoing research program seeking to find natural ways to describe, in geometrical terms, the sets of possible activities in the brain. One component of this program is the spike metrics: notions of distance between two spike trains recorded from a neuron. Alignment spike metrics work by identifying “equivalent” spikes in one train and the ...




Publication year: 2003